    Virtual Conductor for String Quartet Practice

    This paper presents a system that emulates an ensemble conductor for string quartets. The application has been developed as a support tool for individual and group practice, so that users of any age, whether regular musicians or students, can use it to further hone their skills. The virtual conductor offers indications similar to those given by a real ensemble conductor, such as beat times and dynamics. The application allows users to rehearse their performance without an actual conductor present, and also provides additional tools to support the learning/practice process, such as a tuner and a melody evaluator. The system supports both solo and group practice. A set of tests was conducted to check the usefulness of the application as a practice support tool. A group of musicians from the Chamber Orchestra of Malaga, including an ensemble conductor, tested the system and reported finding it a very useful tool in an educational environment, one that helps address the lack of educational tools of this kind in a self-learning setting. This work has been funded by the Ministerio de Economía y Competitividad of the Spanish Government under Project No. TIN2010-21089-C03-02 and Project No. IPT-2011-0885-430000 and by the Ministerio de Industria, Turismo y Comercio under Project No. TSI-090100-2011-25
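    As an illustration of the timing information such a virtual conductor must produce, the sketch below generates a beat grid (beat times and downbeat flags) from a tempo and time signature. The function name and parameters are illustrative assumptions, not the paper's actual design.

    ```python
    # Hypothetical sketch of a conductor's beat grid; names are assumptions.

    def beat_grid(tempo_bpm, beats_per_bar, n_bars):
        """Return (time_in_seconds, is_downbeat) for every beat of the piece."""
        period = 60.0 / tempo_bpm  # seconds between consecutive beats
        grid = []
        for b in range(beats_per_bar * n_bars):
            is_downbeat = (b % beats_per_bar == 0)  # first beat of each bar
            grid.append((b * period, is_downbeat))
        return grid

    grid = beat_grid(tempo_bpm=120, beats_per_bar=4, n_bars=2)
    # At 120 BPM each beat lasts 0.5 s; downbeats fall at 0.0 s and 2.0 s.
    ```

    A real conductor application would additionally schedule dynamics cues and tempo changes on top of such a grid.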

    Evaluation framework for automatic singing transcription

    In this paper, we analyse the evaluation strategies used in previous work on automatic singing transcription, and we present a novel, comprehensive and freely available evaluation framework for the task. This framework consists of a cross-annotated dataset and a set of extended evaluation measures, which are integrated in a Matlab toolbox. The presented evaluation measures are based on the standard MIREX note-tracking measures, but they provide extra information about the type of errors made by the singing transcriber. Finally, a practical case of use is presented, in which the evaluation framework is used to perform a detailed comparison of several state-of-the-art singing transcribers. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech. This work has been funded by the Ministerio de Economía y Competitividad of the Spanish Government under Project No. TIN2013-47276-C6-2-R and by the Junta de Andalucía under Project No. P11-TIC-7154
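    To give a flavour of the kind of note-tracking measure the framework extends, here is a minimal sketch of note-level precision/recall/F-measure in the spirit of MIREX: reference and estimated notes are matched one-to-one if their onsets and pitches fall within tolerances. The tolerances and matching strategy are simplifying assumptions, not the paper's exact measures.

    ```python
    # Sketch, not the paper's toolbox: greedy note matching with assumed tolerances.

    def note_f_measure(ref, est, onset_tol=0.05, pitch_tol=0.5):
        """ref/est: lists of (onset_seconds, pitch_semitones) notes.
        Returns (precision, recall, f_measure) under one-to-one matching."""
        matched = 0
        used = set()  # estimated notes already matched
        for r_on, r_pitch in ref:
            for j, (e_on, e_pitch) in enumerate(est):
                if j in used:
                    continue
                if abs(r_on - e_on) <= onset_tol and abs(r_pitch - e_pitch) <= pitch_tol:
                    matched += 1
                    used.add(j)
                    break
        precision = matched / len(est) if est else 0.0
        recall = matched / len(ref) if ref else 0.0
        f = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        return precision, recall, f
    ```

    The paper's extended measures go further by also categorising the errors (e.g. which mismatches are onset errors versus pitch errors).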

    Rhythm-Based Video Game Assessment.

    Humans show a natural tendency to move or adapt, intentionally or unintentionally, to the beat of rhythmic auditory stimuli such as music. This ability is sustained by a complex neuronal network that includes perceptual regions, motor regions and sensorimotor integration areas, and it can be trained through practice. In this context, music-based video games are a great tool for improving rhythmic skills such as hand-eye coordination and synchronization. An important aspect of this tool is the feedback players receive after playing, which tells them what they did right and what they did wrong; feedback therefore plays a leading role in a player's improvement. The aim of this article is to develop an assessment scheme for a rhythm-based video game that helps users improve their rhythmic skills by playing. The scheme reports information such as hit percentage, accuracy and the tendency to make early or late hits, so that just a few values provide users with a great deal of feedback. This publication is part of the project PDC2021-120997-C33 funded by MCIN/AEI/10.13039/501100011033, and European Union "NextGeneration EU/PRTR". This publication is part of the project PID2021-123207NB-I00 funded by MCIN/AEI/10.13039/501100011033/FEDER, UE. This work was done at Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech. Special thanks to Francisco Vargas Boza (programmer), Juan Carlos Camuna Cotta (3D modeler) and Isabel Tardón (2D character designer) for their contribution to the development of the video game 'Hammersong'
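    The three summary values the abstract names (hit percentage, accuracy, early/late tendency) can be sketched directly from per-hit timing offsets. This is a plausible reading of the scheme, not the game's actual implementation; the function and its sign convention are assumptions.

    ```python
    # Sketch of a rhythm-game assessment; negative offset = early hit, positive = late.

    def assess(hit_offsets_ms, total_notes):
        """hit_offsets_ms: timing offsets (ms) of the notes the player hit.
        total_notes: how many notes the chart contained.
        Returns (hit_percentage, accuracy_ms, tendency_ms)."""
        hits = len(hit_offsets_ms)
        hit_pct = 100.0 * hits / total_notes
        accuracy = sum(abs(o) for o in hit_offsets_ms) / hits  # mean absolute error
        tendency = sum(hit_offsets_ms) / hits  # signed mean: <0 early, >0 late
        return hit_pct, accuracy, tendency

    # A player who hit 4 of 5 notes, slightly late on average:
    print(assess([-20.0, 10.0, 30.0, -10.0], total_notes=5))
    ```

    Splitting accuracy (unsigned) from tendency (signed) is what lets the player see not just *how far off* they are, but *in which direction*.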

    Enhanced average for event-related potential analysis using dynamic time warping

    Electroencephalography (EEG) provides a way to understand and evaluate neurotransmission. In this context, time-locked EEG activity, or event-related potentials (ERPs), is often used to capture neural activity related to specific mental processes. Normally, ERPs are obtained by averaging across a number of trials. However, there is notable variability in latency (jitter) and amplitude across trials, and also across users; this causes the average ERP waveform to blur and, furthermore, diminishes the amplitude of the underlying waves. For these reasons, a strategy based on dynamic time warping (DTW) is proposed for obtaining ERP waveforms: individual trials are adapted and adjusted to the previously calculated averaged ERP, and an enhanced average is built from these warped signals. In light of the experiments carried out on the behaviour of the proposed scheme using publicly available datasets, this strategy reduces the attenuation in amplitude of ERP components by reducing the influence of latency variability and jitter, and thus improves the averaged ERP waveforms. This publication is part of project PID2021-123207NB-I00, funded by MCIN/AEI/10.13039/501100011033/FEDER, UE. This work was partially funded by Junta de Andalucía, Proyectos de I+D+i, in the framework of Plan Andaluz de Investigación, Desarrollo e Innovación (PAIDI 2020), under Project No. PY20_00237. Funding for open access charge: Universidad de Málaga/CBUA. This work was done at Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech
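    The core idea — warp each trial onto the plain average with DTW, then re-average the warped trials — can be sketched as follows. This is a minimal textbook-DTW illustration of the strategy, not the authors' code; the distance, step pattern and aggregation of repeated path indices are simplifying assumptions.

    ```python
    # Sketch of a DTW-based "enhanced average", assuming a standard DTW step pattern.
    import numpy as np

    def dtw_path(x, y):
        """Classic DTW: optimal alignment path between two 1-D signals."""
        n, m = len(x), len(y)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(x[i - 1] - y[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        path, i, j = [], n, m  # backtrack from the end
        while i > 0 and j > 0:
            path.append((i - 1, j - 1))
            step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
        return path[::-1]

    def enhanced_average(trials):
        """trials: (n_trials, n_samples) array. Warp each trial onto the plain
        average, then average the warped trials."""
        avg = np.mean(trials, axis=0)
        warped = []
        for t in trials:
            w = np.zeros_like(avg)
            counts = np.zeros(len(avg))
            for i, j in dtw_path(t, avg):
                w[j] += t[i]  # accumulate trial samples aligned to position j
                counts[j] += 1
            warped.append(w / np.maximum(counts, 1))  # path covers every j of avg
        return np.mean(warped, axis=0)
    ```

    Because trials with shifted latencies get aligned before the second averaging, component peaks are attenuated less than in the plain average.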

    Energy-based features and bi-LSTM neural network for EEG-based music and voice classification.

    The human brain receives stimuli in multiple ways; among them, audio constitutes an important source of stimuli relevant to the brain for communication, amusement, warnings, etc. In this context, the aim of this manuscript is to advance the classification of brain responses to music of diverse genres and to sounds of different nature: speech and music. For this purpose, two experiments were designed to acquire EEG signals from subjects listening to songs of different musical genres and to sentences in various languages. A novel scheme is then proposed to characterize the brain signals for classification; this scheme is based on a feature matrix built from relations between the energy measured at the different EEG channels, together with a bi-LSTM neural network. With the data obtained, evaluations are carried out for EEG-based classification between speech and music, between different musical genres, and of whether the subject likes the song listened to. The experiments show satisfactory performance of the proposed scheme: binary audio-type classification attains 98.66% accuracy, multi-class classification between 4 musical genres attains 61.59%, and binary classification of musical taste reaches 96.96%. Funding for open access charge: Universidad de Málaga / CBUA
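    One simple way to build a feature matrix from "relations between energy measured at the different EEG channels" is a matrix of pairwise channel-energy ratios. The abstract does not specify the exact construction, so the sketch below is only a plausible stand-in for the paper's features, not its actual definition.

    ```python
    # Hypothetical sketch: pairwise channel-energy ratios as a feature matrix.
    import numpy as np

    def energy_ratio_matrix(eeg):
        """eeg: (channels, samples) array of one EEG window.
        Returns a (channels, channels) matrix M with M[i, j] = E_i / E_j,
        where E_c is the signal energy of channel c."""
        energy = np.sum(eeg ** 2, axis=1)  # per-channel energy
        return energy[:, None] / energy[None, :]  # broadcasted pairwise ratios
    ```

    A sequence of such matrices (one per window) would then be the natural input to a bi-LSTM, which reads the sequence in both temporal directions.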

    Framework for automatic singing transcription: Database and Software.

    No full text
    This research data consists of a cross-annotated dataset of 1154 seconds and a novel set of evaluation measures able to report the type of errors made by the system. Both the dataset and a Matlab toolbox including the presented evaluation measures are freely available. In order to show the utility of the work presented in this paper, we have performed a detailed comparative study of three state-of-the-art singing transcribers plus a baseline method, leading to relevant information about the performance of each method. This work has been funded by the Ministerio de Economía y Competitividad of the Spanish Government under Project No. TIN2013-47276-C6-2-R and by the Junta de Andalucía under Project No. P11-TIC-7154. The work has been done at Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech